Determining Regularization Parameters for Derivative Free Neural Learning
Authors
Abstract
Derivative-free optimization methods have recently attracted considerable attention for neural learning. The curse of dimensionality in the neural learning problem makes local optimization methods very attractive; however, the error surface contains many local minima. The discrete gradient method is a special case of derivative-free methods based on bundle methods and has the ability to jump over many local minima. Two types of problems arise when local optimization methods are used for neural learning. The first is sensitivity to the initial weights, which is commonly addressed with a hybrid model; our earlier research has shown that combining the discrete gradient method with global methods such as evolutionary algorithms makes it even more attractive, and such hybrid models have been studied by other researchers as well. A second, less frequently discussed problem is that of large weight values for the synaptic connections of the network. Large synaptic weights often lead to network paralysis and convergence difficulties, especially when a hybrid model is used for fine-tuning the learning task. In this paper we study and analyse the effect of different regularization parameters in our objective function that restrict the weight values without compromising classification accuracy.
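The kind of objective discussed here can be written as a data-fit term plus a weight penalty, E(w) = error(w) + lambda * ||w||^2, where lambda is the regularization parameter being varied. The Python sketch below is our own illustration rather than the authors' code; the quadratic penalty, the tanh network, and the names unpack and regularized_objective are assumptions. It shows how such a regularized objective is evaluated as a single scalar, which is all that a derivative-free optimizer (for example the discrete gradient method or an evolutionary algorithm) requires.

import numpy as np

def unpack(w, n_in, n_hidden, n_out):
    # Reshape a flat weight vector into the two layer matrices.
    k = n_in * n_hidden
    W1 = w[:k].reshape(n_hidden, n_in)
    W2 = w[k:k + n_hidden * n_out].reshape(n_out, n_hidden)
    return W1, W2

def regularized_objective(w, X, y, n_in, n_hidden, n_out, lam=0.01):
    # Data-fit term plus a quadratic penalty that restricts large synaptic weights.
    W1, W2 = unpack(w, n_in, n_hidden, n_out)
    hidden = np.tanh(X @ W1.T)          # hidden-layer activations
    out = np.tanh(hidden @ W2.T)        # network outputs
    error = np.mean((out - y) ** 2)     # classification/approximation error
    penalty = lam * np.sum(w ** 2)      # weight-decay style regularizer
    return error + penalty

# Toy usage: evaluate the objective at a random weight vector.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 5, 1
X = rng.standard_normal((20, n_in))
y = np.sign(X[:, :1])                   # toy binary targets in {-1, +1}
w = rng.standard_normal(n_in * n_hidden + n_hidden * n_out)
print(regularized_objective(w, X, y, n_in, n_hidden, n_out, lam=0.1))

Only scalar evaluations of this function are needed: a derivative-free learner compares objective values at candidate weight vectors, and the size of lam controls how strongly large weights are penalized relative to the classification error.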
Related papers
Ensemble Learning and Evidence Maximization
Ensemble learning by variational free energy minimization is a tool introduced to neural networks by Hinton and van Camp in which learning is described in terms of the optimization of an ensemble of parameter vectors. The optimized ensemble is an approximation to the posterior probability distribution of the parameters. This tool has now been applied to a variety of statistical inference proble...
Analyzing the performance of different machine learning methods in determining the transportation mode using trajectory data
With the widespread adoption of smartphones equipped with the Global Positioning System (GPS), a huge volume of user trajectory data has been generated. To facilitate urban management and provide appropriate services to users, the study of these data has become a widespread research field and has been developing ever since. In this research, the transportation mode of users’ trajectories was identi...
Continuous Neural Networks
This article extends neural networks to the case of an uncountable number of hidden units, in several ways. In the first approach proposed, a finite parametrization is possible, allowing gradient-based learning. While having the same number of parameters as an ordinary neural network, its internal structure suggests that it can represent some smooth functions much more compactly. Under mild ass...
Optimisation of neural state variables estimators of two-mass drive system using the Bayesian regularization method
The paper deals with the application of neural networks for state variables estimation of the electrical drive system with an elastic joint. The torsional vibration suppression of such drive system is achieved by the application of a special control structure with a state-space controller and additional feedbacks from mechanical state variables. Signals of the torsional torque and the load-mach...
Regularization of the Trajectories of Dynamical Systems by Adjusting Parameters
A gradient learning method to regulate the trajectories of some nonlinear chaotic systems is proposed. The method is motivated by the gradient descent learning algorithms for neural networks. It is based on two systems: dynamic optimization system and system for finding sensitivities. Numerical results of several examples are presented, which convincingly illustrate the efficiency of the method...
Journal:
Volume, Issue:
Pages: -
Publication year: 2005